Scott's Pi
Scott's pi (named after William A. Scott) is a statistic for measuring inter-rater reliability for nominal data in communication studies. Textual entities are annotated with categories by different annotators, and various measures are used to assess the extent of agreement between the annotators, one of which is Scott's pi. Since automatic text annotation is a popular problem in natural language processing, and the goal is for the program under development to agree with the human annotators, assessing the extent to which humans agree with each other is important for establishing a reasonable upper bound on computer performance.
Scott's pi is similar to Cohen's kappa in that both improve on simple observed agreement by factoring in the extent of agreement that might be expected by chance. However, the two statistics calculate the expected agreement slightly differently: Scott's pi assumes that the annotators share the same distribution of responses, whereas Cohen's kappa estimates a separate distribution for each annotator, which makes Cohen's kappa slightly more informative. Scott's pi is extended to more than two annotators in the form of Fleiss' kappa.
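For two annotators rating N items, write n_{i\cdot} and n_{\cdot i} for the first and second annotator's marginal counts of category i (notation introduced here for illustration, not taken from the original text). The two chance-agreement terms can then be sketched side by side as
:\Pr(e)_{\text{Scott}} = \sum_i \left( \frac{n_{i\cdot} + n_{\cdot i}}{2N} \right)^{2} , \qquad \Pr(e)_{\text{Cohen}} = \sum_i \frac{n_{i\cdot}}{N} \cdot \frac{n_{\cdot i}}{N} ,
so Scott's pi pools the two annotators' marginals before squaring, while Cohen's kappa multiplies each annotator's marginal proportions separately.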
The equation for Scott's pi, as in Cohen's kappa, is:
:\pi = \frac{\Pr(a) - \Pr(e)}{1 - \Pr(e)},
where Pr(a) is the observed agreement between the annotators and Pr(e) is the agreement expected by chance. Unlike in Cohen's kappa, however, Pr(e) is calculated using joint proportions. A worked example is given below.
Confusion matrix for two annotators, three categories, and 45 items rated (90 ratings for 2 annotators):

                      Annotator 2
                    A     B     C   Marginal sum
  Annotator 1   A   1     2     3        6
                B   4     5     6       15
                C   7     8     9       24
  Marginal sum     12    15    18       45
To calculate the expected agreement, sum the marginals for each category across both annotators and divide by the total number of ratings to obtain the joint proportions, then square and total these:
:\Pr(e) = \left(\frac{6+12}{90}\right)^{2} + \left(\frac{15+15}{90}\right)^{2} + \left(\frac{24+18}{90}\right)^{2} = 0.2000^{2} + 0.3333^{2} + 0.4667^{2} = 0.3689.
To calculate the observed agreement, divide the number of items on which the annotators agreed (the diagonal of the matrix) by the total number of items. In this case,
:\Pr(a) = \frac{1+5+9}{45} = \frac{15}{45} = 0.3333.
Given that Pr(e) = 0.3689, Scott's pi is then
:\pi = \frac{0.3333 - 0.3689}{1 - 0.3689} = -0.056 .
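To double-check the arithmetic, here is a minimal Python sketch; the function name scotts_pi and the list-of-lists matrix layout are choices made for this illustration, not part of the original article.

def scotts_pi(matrix):
    """Scott's pi from a square confusion matrix
    (rows: annotator 1, columns: annotator 2)."""
    n_items = sum(sum(row) for row in matrix)         # 45 items in the example
    n_ratings = 2 * n_items                           # 90 ratings from 2 annotators
    k = len(matrix)

    # Observed agreement: items on the diagonal over all items.
    pr_a = sum(matrix[i][i] for i in range(k)) / n_items

    # Expected agreement: squared joint proportions, pooling both
    # annotators' marginal counts for each category.
    pr_e = 0.0
    for i in range(k):
        row_marginal = sum(matrix[i])                 # annotator 1, category i
        col_marginal = sum(row[i] for row in matrix)  # annotator 2, category i
        pr_e += ((row_marginal + col_marginal) / n_ratings) ** 2

    return (pr_a - pr_e) / (1 - pr_e)

# The worked example above: three categories, 45 items.
matrix = [
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9],
]
print(round(scotts_pi(matrix), 3))                    # -0.056

A negative value, as here, means the annotators agreed on fewer items than chance alone would predict.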
==See also==

*Cohen's kappa
*Fleiss' kappa
*Krippendorff's alpha
